Information Loss in the Human Auditory System
From the eardrum to the auditory cortex, where acoustic stimuli are decoded,
there are several stages of auditory processing and transmission where
information may be lost. In this paper, we aim to quantify the information
loss in the human auditory system using information-theoretic tools.
To do so, we consider a speech communication model, where words are uttered
and sent through a noisy channel, and then received and processed by a human
listener.
We define a notion of information loss that is related to the human word
recognition rate. To assess the word recognition rate of humans, we conduct a
closed-vocabulary intelligibility test. We derive upper and lower bounds on the
information loss. Simulations reveal that the bounds are tight and we observe
that the information loss in the human auditory system increases as the signal
to noise ratio (SNR) decreases. Our framework also allows us to study whether
humans are optimal in terms of speech perception in a noisy environment.
Towards that end, we derive optimal classifiers and compare the human and
machine performance in terms of information loss and word recognition rate. We
observe a higher information loss and lower word recognition rate for humans
compared to the optimal classifiers. In fact, depending on the SNR, the machine
classifier may outperform humans by as much as 8 dB. This implies that for the
speech-in-stationary-noise setup considered here, the human auditory system is
suboptimal at recognizing noisy words.
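One standard way to connect a measured word recognition rate to such an information-loss quantity is Fano's inequality. The sketch below is a generic illustration of that textbook bound, not the paper's actual derivation; the vocabulary size and error rate used in the example are hypothetical.

```python
import math

def fano_upper_bound(p_err: float, m: int) -> float:
    """Fano's inequality: H(W | W_hat) <= h_b(p_err) + p_err * log2(m - 1).
    Upper-bounds (in bits) the uncertainty remaining about the uttered
    word W after observing the listener's guess W_hat, given a word
    error probability p_err and an m-word closed vocabulary."""
    if p_err in (0.0, 1.0):
        h_b = 0.0  # binary entropy vanishes at the endpoints
    else:
        h_b = -p_err * math.log2(p_err) - (1.0 - p_err) * math.log2(1.0 - p_err)
    return h_b + p_err * math.log2(m - 1)

# Hypothetical closed vocabulary of 50 words with a 20% word error rate:
# the residual uncertainty about W is at most ~1.84 bits, out of the
# log2(50) ~ 5.64 bits carried by a uniformly chosen word.
bound = fano_upper_bound(0.2, 50)
```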
Distributed Remote Vector Gaussian Source Coding with Covariance Distortion Constraints
In this paper, we consider a distributed remote source coding problem, where
a sequence of observations of source vectors is available at the encoder. The
problem is to specify the optimal rate for encoding the observations subject to
a covariance matrix distortion constraint and in the presence of side
information at the decoder. For this problem, we derive lower and upper bounds
on the rate-distortion function (RDF) for the Gaussian case, which in general
do not coincide. We then identify cases where the RDF can be derived
exactly. We also show that previous results on specific instances of this
problem can be generalized using our results. We finally show that if the
distortion measure is the mean squared error, or if it is replaced by a certain
mutual information constraint, the optimal rate can be derived from our main
result.
Comment: This is the final version accepted at ISIT'1
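For context, the simplest closed-form instance of such a rate-distortion expression is the memoryless scalar Gaussian source under mean-squared error. The sketch below shows only that textbook case, not the distributed remote setting with decoder side information treated in the paper.

```python
import math

def gaussian_rdf(variance: float, distortion: float) -> float:
    """R(D) for a memoryless Gaussian source under MSE:
    R(D) = 0.5 * log2(variance / D) for 0 < D < variance, else 0."""
    if distortion >= variance:
        return 0.0  # the zero-rate estimate (the mean) already meets the budget
    return 0.5 * math.log2(variance / distortion)

# Each halving of the allowed distortion costs exactly half a bit per sample.
```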
Distributed Remote Vector Gaussian Source Coding for Wireless Acoustic Sensor Networks
In this paper, we consider the problem of remote vector Gaussian source
coding for a wireless acoustic sensor network. Each node receives messages from
multiple nodes in the network and decodes these messages using its own
measurement of the sound field as side information. The node's measurement and
the estimates of the source resulting from decoding the received messages are
then jointly encoded and transmitted to a neighboring node in the network. We
show that for this distributed source coding scenario, one can encode a
so-called conditional sufficient statistic of the sources instead of jointly
encoding multiple sources. We focus on the case where node measurements are in
the form of noisy, linearly mixed combinations of the sources and the acoustic
channel mixing matrices are invertible. For this problem, we derive the
rate-distortion function for vector Gaussian sources and under covariance
distortion constraints.
Comment: 10 pages, to be presented at the IEEE DCC'1
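When the covariance distortion constraint is relaxed to a total mean-squared-error budget, the vector Gaussian RDF reduces to the classical reverse water-filling solution. The numerical sketch below illustrates that textbook case (the eigenvalues and distortion budget are arbitrary examples), not the covariance-constrained result of the paper.

```python
import numpy as np

def reverse_waterfill(eigenvalues, total_distortion):
    """Reverse water-filling: given the eigenvalues lambda_i of a Gaussian
    source covariance and a total MSE budget D, find the water level theta
    with sum_i min(theta, lambda_i) = D, allot D_i = min(theta, lambda_i)
    to each component, and return (rate in bits, per-component distortions)."""
    lam = np.sort(np.asarray(eigenvalues, dtype=float))
    lo, hi = 0.0, float(lam.max())
    for _ in range(200):  # bisection on the water level theta
        theta = 0.5 * (lo + hi)
        if np.minimum(theta, lam).sum() > total_distortion:
            hi = theta
        else:
            lo = theta
    d_i = np.minimum(theta, lam)
    rate = 0.5 * np.log2(lam / d_i).sum()
    return rate, d_i

# Example: eigenvalues (4, 1) and budget D = 2 give theta = 1, so the
# weak component is discarded and only the strong one is coded.
rate, d_i = reverse_waterfill([4.0, 1.0], 2.0)
```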
Source Coding in Networks with Covariance Distortion Constraints
We consider a source coding problem with a network scenario in mind, and
formulate it as a remote vector Gaussian Wyner-Ziv problem under covariance
matrix distortions. We define a notion of minimum for two positive-definite
matrices based on which we derive an explicit formula for the rate-distortion
function (RDF). We then study the special cases and applications of this
result. We show that two well-studied source coding problems, namely the remote
vector Gaussian Wyner-Ziv problems with mean-squared error and mutual
information constraints, are in fact special cases of our results. Finally, we
apply our results to a joint source coding and denoising problem. We consider a
network with a centralized topology and a given weighted sum-rate constraint,
where the received signals at the center are to be fused to maximize the output
SNR while enforcing no linear distortion. We show that one can design the
distortion matrices at the nodes in order to maximize the output SNR at the
fusion center. We thereby bridge denoising and source coding within this
setup.
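A standard construction that maximizes output SNR subject to no linear distortion is the MVDR (minimum-variance distortionless-response) fusion rule. The sketch below illustrates that classical solution on a hypothetical three-node network; it does not reproduce the paper's joint design of per-node distortion matrices under a sum-rate constraint.

```python
import numpy as np

def distortionless_fusion_weights(h, noise_cov):
    """MVDR fusion: weights w maximizing output SNR subject to the
    distortionless constraint w^T h = 1, i.e. w = R^-1 h / (h^T R^-1 h),
    where h is the source gain vector and R the noise covariance."""
    r_inv_h = np.linalg.solve(noise_cov, h)
    return r_inv_h / (h @ r_inv_h)

# Hypothetical 3-node network: each node observes the source scaled by
# h[i] in independent noise of a different variance.
h = np.array([1.0, 0.8, 0.5])
noise_cov = np.diag([0.1, 0.2, 0.4])
w = distortionless_fusion_weights(h, noise_cov)
# w @ h == 1, so the source component passes through undistorted, while
# the output noise power w @ R @ w is minimized (SNR = h^T R^-1 h).
```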